SYNONIMY: LANGUAGE REDUNDANCY, ENTROPY, GROWTH OF INFORMATION

Authors

Abstract

Related articles

Quantifying multivariate redundancy with maximum entropy decompositions of mutual information

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of redundancy lattices, which allows the information that a set of variables contains about a target variable to be separated into nonnegative components, interpretable as information unique to some variables and not provided by others, together with redundant and synergistic components. However, t...
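For orientation, the bookkeeping behind a bivariate decomposition of this kind, for predictors X_1, X_2 and target Y, is the set of standard consistency equations below; the notation is ours, and the specific redundancy measure proposed in the paper is not reproduced here.

\[
\begin{aligned}
I(Y; X_1, X_2) &= \mathrm{Red} + \mathrm{Unq}_1 + \mathrm{Unq}_2 + \mathrm{Syn},\\
I(Y; X_1) &= \mathrm{Red} + \mathrm{Unq}_1,\\
I(Y; X_2) &= \mathrm{Red} + \mathrm{Unq}_2,
\end{aligned}
\]

where Red is the redundant component, Unq_i the information unique to X_i, and Syn the synergistic component, all constrained to be nonnegative.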

Frequency of Occurrence and Information Entropy of American Sign Language

American Sign Language (ASL) uses a series of hand-based gestures ("signs") in place of words, allowing deaf people to communicate. Previous work has shown that although it takes longer to make signs than to say the equivalent words, sentences can on average be completed in about the same time. This leaves unresolved, however, precisely why that should be the case. This paper reports a det...
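The entropy side of such a frequency analysis reduces to estimating Shannon entropy from empirical sign counts. A minimal sketch with made-up frequencies (illustration only, not data or code from the paper):

```python
import math

def shannon_entropy(counts):
    """Shannon entropy in bits of an empirical frequency distribution."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c > 0)

# Hypothetical ASL sign frequencies, for illustration only.
sign_counts = {"YOU": 120, "ME": 95, "GO": 40, "WANT": 30, "SCHOOL": 15}
print(f"H = {shannon_entropy(sign_counts):.3f} bits per sign")
```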

Entropy, Negentropy, and Information

Evaluation criteria for different versions of the same database: the concept of information, during its development, has been connected to the concept of entropy created by nineteenth-century thermodynamics scholars. Information, in this view, means order, or negentropy. Entropy, on the other hand, is connected to concepts such as chaos and noise, which in turn cause disorder. In the present paper, ...

Entropy and Redundancy of Japanese Lexical and Syntactic Compound Verbs

The present study investigated Japanese lexical and syntactic compound verbs (V1+V2) using Shannon's concepts of entropy and redundancy, calculated from corpora drawn from the Mainichi Newspaper and a collection of selected novels. When combinations of a V2 verb with various V1 verbs were compared, syntactic compounds were higher in entropy than lexical compounds, while the two types did not differ in redundancy. This result su...
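Redundancy in Shannon's sense, as invoked here, is the normalized shortfall of entropy relative to its maximum. For a fixed V2 combining with N distinct V1 verbs whose relative corpus frequencies are p_i, the usual quantities are (the paper's exact computation may differ in detail):

\[
H = -\sum_{i=1}^{N} p_i \log_2 p_i, \qquad H_{\max} = \log_2 N, \qquad R = 1 - \frac{H}{H_{\max}}.
\]

On this reading, higher entropy for syntactic compounds means the choice of V1 is closer to uniform, while equal redundancy means both compound types fall short of their respective maxima by a similar proportion.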

Synergy, Redundancy and Common Information

We consider the problem of decomposing the total mutual information conveyed by a pair of predictor random variables about a target random variable into redundant, unique and synergistic contributions. We focus on the relationship between "redundant information" and the more familiar information-theoretic notions of "common information." Our main contribution is an impossibility result. We show ...
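The difficulty such impossibility results trade on is easiest to see in the textbook XOR case: neither predictor alone carries any information about the target, yet together they determine it, so any nonnegative decomposition must book the entire bit as synergy. A self-contained sketch (our own illustration, not code from the paper):

```python
import math
from itertools import product

def mutual_information(pxy):
    """I(X;Y) in bits, given a joint distribution {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Y = X1 XOR X2 with X1, X2 independent fair bits.
joint_x1_y, joint_x2_y, joint_x12_y = {}, {}, {}
for x1, x2 in product([0, 1], repeat=2):
    y, p = x1 ^ x2, 0.25
    joint_x1_y[(x1, y)] = joint_x1_y.get((x1, y), 0.0) + p
    joint_x2_y[(x2, y)] = joint_x2_y.get((x2, y), 0.0) + p
    joint_x12_y[((x1, x2), y)] = joint_x12_y.get(((x1, x2), y), 0.0) + p

print(mutual_information(joint_x1_y))   # 0.0 bits: X1 alone is uninformative
print(mutual_information(joint_x2_y))   # 0.0 bits: X2 alone is uninformative
print(mutual_information(joint_x12_y))  # 1.0 bit: jointly they determine Y (pure synergy)
```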


Journal

Journal title: Sovremennye issledovaniya sotsialnykh problem

Year: 2015

ISSN: 2218-7405

DOI: 10.12731/2218-7405-2015-10-14